Search results for "Kolmogorov complexity"

Showing 10 of 16 documents

Textual data compression in computational biology: Algorithmic techniques

2012

Abstract In a recent review [R. Giancarlo, D. Scaturro, F. Utro, Textual data compression in computational biology: a synopsis, Bioinformatics 25 (2009) 1575–1586] the first systematic organization and presentation of the impact of textual data compression for the analysis of biological data has been given. Its main focus was on a systematic presentation of the key areas of bioinformatics and computational biology where compression has been used together with a technical presentation of how well-known notions from information theory have been adapted to successfully work on biological data. Rather surprisingly, the use of data compression is pervasive in computational biology. Starting from…
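
A hedged illustration (not taken from the review itself) of one of the information-theoretic notions it surveys: the empirical Shannon entropy of a nucleotide string, in bits per symbol.

```python
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Empirical Shannon entropy of a string, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniformly mixed 4-letter alphabet gives the maximum 2 bits/symbol;
# fewer distinct symbols (or a skewed distribution) give less.
print(shannon_entropy("ACGT" * 100))      # 2.0
print(shannon_entropy("AAAACCCC" * 50))   # 1.0
```

Compression-based analyses of biological sequences typically start from zeroth-order estimates like this one before moving to richer models (higher-order contexts, Lempel–Ziv parsing).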

Biological data; Data Compression Theory and Practice; Alignment-free sequence comparison; Entropy; Huffman coding; Hidden Markov Models; Kolmogorov complexity; Lempel–Ziv compressors; Minimum Description Length principle; Pattern discovery in bioinformatics; Reverse engineering of biological networks; Sequence alignment; Settore INF/01 - Informatica; General Computer Science; Computer science; Search engine indexing; Computational biology; Information theory; Information science; Theoretical Computer Science; Technical Presentation; Entropy (information theory); Data compression; Computer Science Review

Algorithmic Information Theory and Computational Complexity

2013

We present examples where theorems on the complexity of computation are proved using methods from algorithmic information theory. The first example is a non-effective construction of a language for which the size of any deterministic finite automaton exponentially exceeds the size of a probabilistic finite automaton with bounded error. The second example concerns frequency computation. Frequency computation was introduced by Rose and McNaughton in the early sixties and developed by Trakhtenbrot, Kinber, Degtev, Wechsung, Hinrichs and others. A transducer is a finite-state automaton with an input and an output. We consider the possibilities of probabilistic and frequency transducers and prove sever…

Discrete mathematics; Average-case complexity; Algorithmic information theory; TheoryofComputation_COMPUTATIONBYABSTRACTDEVICES; Kolmogorov complexity; Descriptive complexity theory; Computational physics; Structural complexity theory; TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES; Deterministic finite automaton; Asymptotic computational complexity; Computer Science::Formal Languages and Automata Theory; Computational number theory; Mathematics

Lower space bounds for randomized computation

1994

It is a fundamental problem in randomized computation how to separate different randomized time or randomized space classes (cf., e.g., [KV87, KV88]). We separated randomized space classes below log n in [FK94]. Now we have succeeded in separating small randomized time classes for multi-tape two-way Turing machines. Surprisingly, these "small" bounds are of type n+f(n) with f(n) not exceeding linear functions. This new approach to "sublinear" time complexity is a natural counterpart to sublinear space complexity. The latter was introduced by considering the input tape and the work tape as separate devices and distinguishing between the space used for processing information and the spa…

Discrete mathematics; Combinatorics; Turing machine; Sublinear function; Kolmogorov complexity; Type (model theory); Binary logarithm; Space (mathematics); Time complexity; Word (computer architecture); Mathematics

Application of Kolmogorov complexity to inductive inference with limited memory

1995

Abstract. We consider inductive inference with limited memory [1]. We show that there exists a set U of total recursive functions such that: U can be learned with linear long-term memory (and no short-term memory); U can be learned with logarithmic long-term memory (and some amount of short-term memory); if U is learned with sublinear long-term memory, then the short-term memory exceeds any recursive function. Thus an open problem posed by Freivalds, Kinber and Smith [1] is solved. To prove our result, we use Kolmogorov complexity.

Discrete mathematics; Hardware_MEMORYSTRUCTURES; Kolmogorov complexity; Logarithm; Sublinear function; Kolmogorov structure function; Chain rule for Kolmogorov complexity; Open problem; Inductive probability; Inductive reasoning; Mathematics

Amount of Nonconstructivity in Finite Automata

2009

When D. Hilbert used nonconstructive methods in his famous paper on invariants (1888), P. Gordan tried to prevent the publication of this paper, considering these methods non-mathematical. L. E. J. Brouwer in the early twentieth century initiated the intuitionist movement in mathematics. His slogan was "nonconstructive arguments have no value for mathematics". However, P. Erdős obtained many exciting results in discrete mathematics by nonconstructive methods. It is widely believed that these results either cannot be proved by constructive methods or the proofs would have been prohibitively complicated. R. Freivalds [7] showed that nonconstructive methods in coding theory are related to the notion of…

Discrete mathematics; Probabilistic method; Deterministic finite automaton; Kolmogorov complexity; Intuitionism; Limit (mathematics); Mathematical proof; Constructive; Method of conditional probabilities; Mathematics

Finite State Transducers with Intuition

2010

Finite automata that take advice have been studied from the point of view of how much advice is needed to recognize nonregular languages. It turns out that there are at least two different types of advice. In this paper we concentrate on cases where the given advice contains zero information about the input word and the language to be recognized. Nonetheless some nonregular languages can be recognized in this way. The help-word is merely a sufficiently long word of nearly maximum Kolmogorov complexity. Moreover, any sufficiently long word of nearly maximum Kolmogorov complexity can serve as a help-word. Finite automata with such help can recognize languages not recognizable …
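
The help-words here are words of nearly maximal Kolmogorov complexity; that quantity is uncomputable, but compressed size gives a crude, computable upper bound. A hedged sketch (my illustration, not the paper's construction) contrasting a highly regular word with a random one:

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    # Kolmogorov complexity is uncomputable; zlib's output length serves
    # here as a rough, computable upper bound, purely for illustration.
    return len(zlib.compress(data, 9))

structured = b"ab" * 500       # 1000 bytes with an obvious short description
random_ish = os.urandom(1000)  # 1000 random bytes: near-incompressible

print(compressed_size(structured))  # tiny: the regularity is fully exploited
print(compressed_size(random_ish))  # close to 1000, plus a few bytes of overhead
```

A word whose compressed size stays close to its length is a plausible stand-in for the "nearly maximum Kolmogorov complexity" help-words the abstract describes.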

Discrete mathematics; Theoretical computer science; Nested word; Kolmogorov complexity; Computer science; Computer Science::Computation and Language (Computational Linguistics and Natural Language and Speech Processing); Nondeterministic algorithm; TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES; Deterministic finite automaton; Kolmogorov structure function; Probabilistic automaton; Quantum finite automata; Nondeterministic finite automaton; Computer Science::Formal Languages and Automata Theory

Adaptive learning of compressible strings

2020

Suppose an oracle knows a string $S$ that is unknown to us and that we want to determine. The oracle can answer queries of the form "Is $s$ a substring of $S$?". In 1995, Skiena and Sundaram showed that, in the worst case, any algorithm needs to ask the oracle $\sigma n/4 -O(n)$ queries in order to be able to reconstruct the hidden string, where $\sigma$ is the size of the alphabet of $S$ and $n$ its length, and gave an algorithm that spends $(\sigma-1)n+O(\sigma \sqrt{n})$ queries to reconstruct $S$. The main contribution of our paper is to improve the above upper bound in the context where the string is compressible. We first present a universal algorithm that, given a (computable) compre…
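
A minimal sketch of reconstruction from substring queries in this spirit (a simplification for illustration, not the paper's algorithm): extend a known substring to the right until it is a suffix of $S$, then to the left until it is all of $S$, spending roughly $\sigma$ queries per recovered character.

```python
def reconstruct(query, alphabet):
    """Recover a hidden non-empty string S using only queries of the form
    'is s a substring of S?'. Assumes `query` answers exactly that question
    and `alphabet` contains every symbol of S."""
    # Find some symbol occurring in S.
    t = next(c for c in alphabet if query(c))
    # Extend right until no extension is a substring: t is then a suffix,
    # and in fact occurs only at the end of S.
    extended = True
    while extended:
        extended = False
        for c in alphabet:
            if query(t + c):
                t, extended = t + c, True
                break
    # Extend left; every c+t found is still a suffix, so when no further
    # extension exists, t has nothing to its left and t == S.
    extended = True
    while extended:
        extended = False
        for c in alphabet:
            if query(c + t):
                t, extended = c + t, True
                break
    return t

hidden = "abracadabra"
queries = 0
def oracle(s):
    global queries
    queries += 1
    return s in hidden

# Prints the recovered string and the number of queries spent.
print(reconstruct(oracle, "abcdr"), queries)
```

This naive version spends about $\sigma$ queries per character; the paper's contribution is to do better when $S$ is compressible.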

FOS: Computer and information sciences; Centroid decomposition; General Computer Science; String compression; Adaptive learning; Kolmogorov complexity; Context (language use); Data_CODINGANDINFORMATIONTHEORY; String reconstruction; Theoretical Computer Science; Combinatorics; String learning; Lempel-Ziv; Suffix tree; Integer; Computer Science - Data Structures and Algorithms; Order (group theory); Data Structures and Algorithms (cs.DS); Time complexity; Computer Science::Databases; Settore INF/01 - Informatica; Linear space; String (computer science); Substring; Bounded function; Mathematics

All Classical Adversary Methods Are Equivalent for Total Functions

2017

We show that all known classical adversary lower bounds on randomized query complexity are equivalent for total functions and are equal to the fractional block sensitivity fbs(f). That includes the Kolmogorov complexity bound of Laplante and Magniez and the earlier relational adversary bound of Aaronson. This equivalence also implies that for total functions, the relational adversary is equivalent to a simpler lower bound, which we call the rank-1 relational adversary. For partial functions, we show unbounded separations between fbs(f) and other adversary bounds, as well as between the adversary bounds themselves. We also show that, for partial functions, fractional block sensitivity canno…
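
Fractional block sensitivity fbs(f) is the linear-programming relaxation of ordinary block sensitivity bs(f). As a hedged aside (a brute-force illustration of the integral notion, not the paper's machinery), bs at a point can be computed directly for tiny Boolean functions:

```python
from itertools import combinations

def bs_at(f, x):
    """Block sensitivity of f at input x (brute force, tiny n only): the
    maximum number of pairwise disjoint blocks B with f(x^B) != f(x)."""
    n = len(x)
    def flip(x, block):
        return tuple(1 - b if i in block else b for i, b in enumerate(x))
    # All sensitive blocks at x.
    sensitive = [frozenset(B)
                 for r in range(1, n + 1)
                 for B in combinations(range(n), r)
                 if f(flip(x, B)) != f(x)]
    best = 0
    def pack(used, count, start):
        # Greedy-free exhaustive packing of disjoint sensitive blocks.
        nonlocal best
        best = max(best, count)
        for i in range(start, len(sensitive)):
            if not (sensitive[i] & used):
                pack(used | sensitive[i], count + 1, i + 1)
    pack(frozenset(), 0, 0)
    return best

OR = lambda x: int(any(x))
# At the all-zeros input every singleton {i} is a sensitive block, so bs = n.
print(bs_at(OR, (0, 0, 0, 0)))  # 4
```

fbs(f) replaces the disjointness constraint by fractional weights summing to at most 1 on each coordinate, so fbs(f) >= bs(f) always holds.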

FOS: Computer and information sciences; Kolmogorov complexity; Block (permutation group theory); Function (mathematics); Computational Complexity (cs.CC); Adversary; Upper and lower bounds; Theoretical Computer Science; Combinatorics; Computer Science - Computational Complexity; Computational Theory and Mathematics; Partial function; Sensitivity (control systems); Equivalence (measure theory); Mathematics; ACM Transactions on Computation Theory

Amount of nonconstructivity in deterministic finite automata

2010

Abstract. When D. Hilbert used nonconstructive methods in his famous paper on invariants (1888), P. Gordan tried to prevent the publication of this paper, considering these methods non-mathematical. L.E.J. Brouwer in the early twentieth century initiated the intuitionist movement in mathematics. His slogan was "nonconstructive arguments have no value for mathematics". However, P. Erdős obtained many exciting results in discrete mathematics by nonconstructive methods. It is widely believed that these results either cannot be proved by constructive methods or the proofs would have been prohibitively complicated. The author (Freivalds, 2008) [10] showed that nonconstructive methods in coding theory are…

General Computer Science; Kolmogorov complexity; Mathematical proof; Constructive; Theoretical Computer Science; Algebra; Deterministic finite automaton; Probabilistic method; Intuitionism; Deterministic automaton; Nonconstructive methods; Calculus; Finite automata; Method of conditional probabilities; Mathematics; Computer Science (all)

Comparison of genomic sequences clustering using Normalized Compression Distance and Evolutionary Distance

2008

Genomic sequences are usually compared using evolutionary distance, a procedure that requires aligning the sequences. Alignment of long sequences is a lengthy procedure, and the resulting dissimilarity is not a metric. Recently the normalized compression distance was introduced as a method to calculate the distance between two generic digital objects, and it seems a suitable way to compare genomic strings. In this paper the clustering and the mapping obtained using a SOM with the traditional evolutionary distance and with the compression distance are compared, in order to understand whether the two sets of distances are similar. The first results indicate that the two distances catch differen…
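
The normalized compression distance has the standard form NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C is the compressed size under some real compressor. A minimal sketch using zlib as an illustrative stand-in (the paper's own compressor choice is not specified in this excerpt):

```python
import zlib

def C(x: bytes) -> int:
    """Compressed size in bytes; zlib stands in for the ideal compressor."""
    return len(zlib.compress(x, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: (C(xy) - min(C(x),C(y))) / max(C(x),C(y))."""
    cx, cy = C(x), C(y)
    return (C(x + y) - min(cx, cy)) / max(cx, cy)

seq1 = b"ACGT" * 200
seq2 = b"ACGT" * 199 + b"TTTT"   # nearly identical to seq1
seq3 = bytes(range(256)) * 4     # unrelated byte pattern

print(ncd(seq1, seq2))  # small: the two strings share almost all structure
print(ncd(seq1, seq3))  # much larger: little shared structure
```

Because C(xy) barely exceeds max(C(x), C(y)) when x and y share structure, similar sequences get distances near 0 and unrelated ones near 1, with no alignment step required.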

Kolmogorov complexity; Universal similarity metric; Computer science; DNA sequence; Pattern recognition; Genomic sequence clustering; Compression (functional analysis); Normalized compression distance; Artificial intelligence; Cluster analysis; Distance matrices in phylogeny; Clustering